On the Convergence of MDL Density Estimation

Author

  • Tong Zhang
Abstract

We present a general information exponential inequality that measures the statistical complexity of some deterministic and randomized density estimators. Using this inequality, we are able to improve classical results concerning the convergence of two-part code MDL in [1]. Moreover, we are able to derive clean finite-sample convergence bounds that are not obtainable using previous approaches.
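
For readers unfamiliar with the setup, the following is a minimal sketch of the two-part code MDL density estimator and of the resolvability-type form that convergence bounds of this kind take; the notation (model class \Gamma, code lengths L(p), constant C) is generic and not taken from the paper itself.

% Two-part code MDL over a countable class \Gamma, with prefix code
% lengths L(p) satisfying the Kraft inequality \sum_{p \in \Gamma} 2^{-L(p)} \le 1:
\[
  \hat{p} \;=\; \operatorname*{arg\,min}_{p \in \Gamma}
  \Bigl[\, L(p) \;-\; \sum_{i=1}^{n} \log_2 p(X_i) \,\Bigr]
\]
% Convergence statements typically bound a distance between the true
% density q and \hat{p} by an index-of-resolvability trade-off:
\[
  \mathbb{E}\, d_H^2\bigl(q, \hat{p}\bigr) \;\le\;
  C \,\min_{p \in \Gamma} \Bigl[ \frac{L(p)}{n} \;+\; D_{\mathrm{KL}}(q \,\|\, p) \Bigr]
\]
% with d_H the Hellinger distance and C an absolute constant.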

Related articles

From ε-entropy to KL-entropy: Analysis of Minimum Information Complexity Density Estimation

We consider an extension of ε-entropy to a KL-divergence based complexity measure for randomized density estimation methods. Based on this extension, we develop a general information theoretical inequality that measures the statistical complexity of some deterministic and randomized density estimators. Consequences of the new inequality will be presented. In particular, we show that this techniq...
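
As context for the KL-divergence based complexity measure mentioned above, here is a sketch of the KL-regularized (Gibbs) formulation that this line of work studies; the prior \pi, temperature \rho, and randomized estimator \hat{w} are generic notation, not taken from the blurb.

% Minimum information complexity: minimize empirical log-loss plus a
% KL penalty toward the prior \pi over distributions w on the model class:
\[
  \hat{w} \;=\; \operatorname*{arg\,min}_{w}
  \Bigl\{ \mathbb{E}_{p \sim w}\Bigl[ -\sum_{i=1}^{n} \log p(X_i) \Bigr]
  \;+\; \tfrac{1}{\rho}\, D_{\mathrm{KL}}(w \,\|\, \pi) \Bigr\}
\]
% The minimizer is the Gibbs posterior
\[
  \hat{w}(dp) \;\propto\; \pi(dp)\, \prod_{i=1}^{n} p(X_i)^{\rho},
\]
% which reduces to the Bayesian posterior at \rho = 1 and concentrates on
% maximum-likelihood-type estimates as \rho \to \infty.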

Wavelet Based Estimation of the Derivatives of a Density for a Discrete-Time Stochastic Process: Lp-Losses

We propose a method of estimation of the derivatives of a probability density based on wavelet methods for a sequence of random variables with a common one-dimensional probability density function and obtain an upper bound on Lp-losses for such estimators. We suppose that the process is strongly mixing, and we show that the rate of convergence essentially depends on the behavior of a special quad...
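
For orientation, a sketch of the standard linear wavelet estimator of a density derivative that results of this kind concern, assuming a compactly supported, sufficiently smooth scaling function \varphi; the notation (resolution level j, order d) is generic and not taken from the blurb.

% With \varphi_{j,k}(x) = 2^{j/2} \varphi(2^j x - k), integration by parts gives
\[
  \alpha_{j,k} \;=\; \int f^{(d)}(x)\, \varphi_{j,k}(x)\, dx
  \;=\; (-1)^d \int f(x)\, \varphi_{j,k}^{(d)}(x)\, dx
  \;=\; (-1)^d\, \mathbb{E}\bigl[\varphi_{j,k}^{(d)}(X)\bigr],
\]
% so the coefficients can be estimated by empirical means, and the linear
% estimator of the d-th derivative at resolution level j_n is
\[
  \hat{f}^{(d)}(x) \;=\; \sum_{k} \hat{\alpha}_{j_n,k}\, \varphi_{j_n,k}(x),
  \qquad
  \hat{\alpha}_{j,k} \;=\; \frac{(-1)^d}{n} \sum_{i=1}^{n} \varphi_{j,k}^{(d)}(X_i).
\]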

Linear Wavelet-Based Estimation for Derivative of a Density under Random Censorship

In this paper we consider estimation of the derivative of a density based on wavelet methods using randomly right censored data. We extend the results regarding the asymptotic convergence rates due to Prakasa Rao (1996) and Chaubey et al. (2008) under the random censorship model. Our treatment is facilitated by results of Stute (1995) and Li (2003) that enable us to demonstrate that the same con...

MDL Histogram Density Estimation

We regard histogram density estimation as a model selection problem. Our approach is based on the information-theoretic minimum description length (MDL) principle, which can be applied for tasks such as data clustering, density estimation, image denoising and model selection in general. MDL-based model selection is formalized via the normalized maximum likelihood (NML) distribution, which has se...
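
For context, the normalized maximum likelihood distribution referred to above has the following standard form for a model class \mathcal{M} = \{p(\cdot \mid \theta)\} over a discrete sample space; continuous cases replace the sum with an integral.

\[
  P_{\mathrm{NML}}(x^n \mid \mathcal{M}) \;=\;
  \frac{ p\bigl(x^n \mid \hat{\theta}(x^n)\bigr) }
       { \sum_{y^n} p\bigl(y^n \mid \hat{\theta}(y^n)\bigr) }
\]
% where \hat{\theta}(x^n) is the maximum-likelihood estimate for x^n. The log
% of the denominator is the parametric complexity, and MDL model selection
% picks the class minimizing the stochastic complexity
% -\log P_{\mathrm{NML}}(x^n \mid \mathcal{M}).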

Exact Minimax Predictive Density Estimation and MDL

The problems of predictive density estimation with Kullback-Leibler loss, optimal universal data compression for MDL model selection, and the choice of priors for Bayes factors in model selection are interrelated. Research in recent years has identified procedures which are minimax for risk in predictive density estimation and for redundancy in universal data compression. Here, after reviewing ...
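
For context, the objects connecting these problems can be sketched as follows; the notation (prior \pi, parameter \theta, observed x, future y) is generic and not taken from the blurb.

% Bayes predictive density under prior \pi, after observing x:
\[
  \hat{p}_{\pi}(y \mid x) \;=\; \int p(y \mid \theta)\, \pi(\theta \mid x)\, d\theta,
\]
% evaluated by its Kullback-Leibler risk
\[
  R(\theta, \hat{p}) \;=\; \mathbb{E}_{x \mid \theta}\,
  D_{\mathrm{KL}}\bigl( p(\cdot \mid \theta) \,\big\|\, \hat{p}(\cdot \mid x) \bigr).
\]
% The same quantity, accumulated over a sequence of predictions, is the
% code-length redundancy of the corresponding universal code, which is what
% ties minimax predictive density estimation to MDL model selection.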

Publication date: 2004